News

Cassandra method: anticipating the risks of new technologies

31 Mar 2026

New applications can contain risks. Psychologist Sarah Diefenbach and media informatics specialist Daniel Ullrich, both from LMU, have developed a method for identifying these risks early – and thus improving new inventions.

Prof. Sarah Diefenbach explores how to deal with new technologies: their benefits, but also their potential negative consequences.

Many technologies have negative consequences as well as benefits. Indeed, these harmful aspects are often built into the business model. This is the case, for example, when users find it difficult to leave a certain social media channel instead of continuing to scroll. Or when people entrust even the most personal questions and tasks to an AI, and perhaps naively adopt its suggestions, because the app manages to imitate human behavior and users establish a bond with it.

But even when developers are motivated by the best of intentions, they are not always aware of possible unwanted effects. “Every innovation has its flipside,” explains Sarah Diefenbach, Professor of Economic Psychology and Human-Computer Interaction. Take the example of health apps: health insurers want the apps to strengthen the health of their members. But if the apps turn health into a game that dishes out stars for good behavior, this could cause people to lose touch with their own bodies, reckons Diefenbach.

Thinking about weaknesses

The Cassandra method can be used to identify potential negative effects of digital technologies on users as early as the design phase. | © Sarah Diefenbach

“We can and should identify the negative effects of innovative technology from an early stage so as to be able to prevent them,” says Sarah Diefenbach. Precisely to this end, the LMU psychologist joined forces with LMU media informatics specialist Daniel Ullrich to develop the so-called Cassandra method.

The method encourages participants to engage in critical reflection in a playful manner, with the goal of making side effects visible during the design process. Instead of focusing solely on the opportunities and benefits of new technologies, participants reflect on their possible weaknesses as well, examine the attendant risks and side effects, and improve the product design accordingly.

The method is named for the character from Greek mythology whose warnings went tragically unheeded. Millennia later, people still find it difficult to take criticism on board. This can have far-reaching consequences when it comes to the development of new technologies.

This is where the Cassandra method comes in. It allows people to adopt a critical perspective in a playful context and systematically analyze potential negative consequences. It works best in teams of three to ten people, who come together for a half-day workshop. During a brainstorming phase, participants collect as many possible side effects of a technology as they can think of. They work here with cards that suggest various perspectives, helping them to envision possible risks.

For example, somebody playing the role of a stalker could consider how to exploit an app to spy on others. A saboteur could look at how to manipulate a system. An unscrupulous investor could think about how to make as much money as possible from a product, while happy to accept even serious negative consequences.
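The perspective cards can be thought of as structured brainstorming prompts, one per adversarial role. The following sketch illustrates that idea in Python; the role names are taken from the examples above, but the prompts, function names, and data structure are illustrative assumptions, not the actual Cassandra card set:

```python
# Illustrative sketch of the perspective cards used in the brainstorming
# phase. The roles come from the article; the prompt wording and this
# structure are hypothetical, not the published card set.

PERSPECTIVE_CARDS = {
    "stalker": "How could the app be exploited to spy on others?",
    "saboteur": "How could the system be manipulated?",
    "unscrupulous investor": (
        "How could the product be maximally monetized, "
        "even at the cost of serious negative consequences?"
    ),
}


def brainstorm_prompts(technology: str) -> list[str]:
    """Turn each perspective card into a concrete prompt for one technology."""
    return [
        f"[{role}] Considering '{technology}': {question}"
        for role, question in PERSPECTIVE_CARDS.items()
    ]


for prompt in brainstorm_prompts("sports meetup app"):
    print(prompt)
```

Framing each card as a question bound to a role is what lets a small team cover many attack angles systematically rather than relying on whatever risks happen to come to mind.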


Weighting scenarios

In the subsequent assessment phase, the collected scenarios are classified and weighted: How severe would the potential harm be? How likely is it to occur? And what can be done to contain the risks?
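The assessment step resembles a classic risk matrix. The sketch below illustrates one way to score and rank scenarios, assuming a severity-times-likelihood heuristic; the scales, example scenarios, and mitigations are illustrative assumptions, not part of the published method:

```python
from dataclasses import dataclass


@dataclass
class Scenario:
    """A side-effect scenario collected during brainstorming (illustrative)."""
    description: str
    severity: int    # 1 (minor) .. 5 (severe) -- assumed scale
    likelihood: int  # 1 (unlikely) .. 5 (very likely) -- assumed scale
    mitigation: str = ""

    @property
    def risk_score(self) -> int:
        # Common risk-matrix heuristic: severity times likelihood.
        return self.severity * self.likelihood


def prioritize(scenarios: list[Scenario]) -> list[Scenario]:
    """Rank scenarios so the team addresses the highest risks first."""
    return sorted(scenarios, key=lambda s: s.risk_score, reverse=True)


# Hypothetical scenarios for a sports meetup app.
scenarios = [
    Scenario("Stalker uses location data to track a user", 5, 3,
             "Hide exact meeting points until users confirm"),
    Scenario("Gamified rewards encourage overtraining", 3, 4,
             "Cap daily activity streaks"),
]

for s in prioritize(scenarios):
    print(f"{s.risk_score:>2}  {s.description}")
```

Ranking by a combined score makes the trade-off explicit: a severe but unlikely scenario can still outrank a mild but frequent one, and each entry carries its proposed mitigation into the design discussion.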

A workshop with Munich start-up “Ambit – Our Sports” illustrates how this works in practice. Co-founder Philipp Tschochohei developed an app that lets people spontaneously meet up in public spaces to play team sports. The project was supported at LMU through the EXIST start-up program.

The primary motivation of the founders was to foster community with the app. “We thought about how to bring people together at sporting facilities,” recalls Tschochohei. “But that meeting up in a remote park late in the evening can have risks, and that people do not like to be stalked – these were aspects we did not seriously consider at the beginning.”

For schoolchildren, too, meeting up with strangers to do sport in the afternoon can be problematic. “Although these things did cross our mind,” says Tschochohei, “they did not lead to the creation of work packages for development.” With the help of the Cassandra method, these possible negative aspects are systematically recorded and then factored into the design.

Blind spot in development

There are various reasons why possible negative consequences are often overlooked. One of them is the psychological concept of confirmation bias, our tendency to mainly perceive information that confirms our own convictions. Conflicting information, by contrast, is all the more easily suppressed.

On top of this, there is people’s enthusiasm for their own idea. Developers are often strongly technology-minded and convinced of their innovation. When people think like this, criticism can seem like an obstacle that is being put in their way. “Every solution contains some poison and can be the source of new problems – but this is not something that engineers and start-up entrepreneurs like to hear,” says Daniel Ullrich. “They’ve got a blind spot there.”

Criticize first, then build

In a workshop, teams use this box as part of the Cassandra method to collaborate and identify critical issues in the development of their technology. | © Sarah Diefenbach

In many areas of life, we appreciate the importance of considering possible consequences at an early stage. In clinical studies, for example, ethical guidelines have to be observed. This involves systematically asking who could be harmed. In technology development, by contrast, visions of positive benefits often dominate – in the spirit of build first, fix later.

The Cassandra method tries to make such blind spots visible. It gets teams to consciously take up a counterposition and ask: What negative side effects could a technology have? How could it be misused? What happens when it comes into contact with particularly vulnerable user groups?

Diefenbach sees great potential here. The method could be employed everywhere that technological innovation takes place, and help teams identify risks early during product development.

“Naming possible negative consequences of a new technology, incorporating criticism from an early stage, and taking this criticism into account during development,” says Diefenbach, “that would be the ideal.” At a human-computer interaction conference in Sydney, the Cassandra researchers received the Best Paper Award. Now the team is working on making the method available on a website.

Publication:

Ullrich, D., Bischoff, E. & Diefenbach, S. (2025). The Cassandra Method: Using Dystopian Visions to Inform Responsible HCI Design and Evaluation. In: Proceedings of the 37th Australian Conference on Human-Computer Interaction (OZCHI '25), 505–529.


Diefenbach, S. & Ullrich, D. (2024). The Cassandra Method: Dystopian Visions as a Basis for Responsible Design. In: Engineering Proceedings 2024.
